236779: Foundations of Algorithms for Massive Datasets, Lecture 4: The Johnson-Lindenstrauss Lemma

Author

  • Charles Sutton
Abstract

The Johnson-Lindenstrauss lemma and its proof. This lecture aims to prove the Johnson–Lindenstrauss lemma. Since the lemma follows easily from another interesting lemma, part of this lecture is devoted to the proof of that second lemma. At the end, the optimality of the Johnson–Lindenstrauss lemma is discussed.

Lemma 1 (Johnson–Lindenstrauss). Let X ⊆ R^n be the initial space with |X| = N, let ε < 1/4 be the distortion parameter, δ the probability of failure, and K < n the target dimension. There exists a random matrix Φ ∈ R^{K×n} such that, with probability at least 1 − δ, for all x, y ∈ X,

(1 − ε) ‖x − y‖² ≤ ‖Φx − Φy‖² ≤ (1 + ε) ‖x − y‖²,

with K = O(log(N/δ) / ε²).

The following lemma is relevant to proving the Johnson–Lindenstrauss lemma.
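As a concrete illustration of the lemma, here is a minimal Python sketch using a dense Gaussian random matrix, which is one standard construction with the JL property; the constant 8 in the choice of K and all function names are illustrative, not taken from the lecture.

import numpy as np

def jl_transform(X, eps=0.25, delta=0.05, rng=None):
    """Embed the rows of X (N points in R^n) into R^K using a Gaussian
    random matrix Phi with i.i.d. N(0, 1/K) entries."""
    rng = np.random.default_rng() if rng is None else rng
    N, n = X.shape
    # K = O(log(N / delta) / eps^2); the constant 8 is an illustrative choice.
    K = int(np.ceil(8 * np.log(N / delta) / eps ** 2))
    Phi = rng.normal(scale=1.0 / np.sqrt(K), size=(K, n))
    return X @ Phi.T, Phi

# Usage: 100 points in R^10000, distortion parameter eps = 0.25.
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 10000))
Y, Phi = jl_transform(X, eps=0.25, delta=0.05, rng=rng)
d_orig = np.linalg.norm(X[0] - X[1])
d_proj = np.linalg.norm(Y[0] - Y[1])
print(f"K = {Y.shape[1]}, distance ratio = {d_proj / d_orig:.3f}")

Scaling the entries by 1/√K makes E‖Φx‖² = ‖x‖², which is the starting point of the concentration argument behind the lemma.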


Similar resources

Geometric Optimization, April 12, 2007, Lecture 25: Johnson-Lindenstrauss Lemma

The topic of this lecture is dimensionality reduction. Many problems can be solved efficiently in low dimensions, but the low-dimensional solutions are often impractical in high-dimensional spaces because either the space or the running time is exponential in the dimension. To address this curse of dimensionality, one technique is to map a set of points in a high-dimensional space...


236779: Foundations of Algorithms for Massive Datasets, Nov 11, 2015 Lecture

These notes cover the end of the Frequent-items (Batch-Decrement) sketch, the Count-Min sketch, the F2 Tug-of-War (AMS) sketch, and initial background for dimensionality reduction and the Johnson-Lindenstrauss transform. 1 Reminder: Frequency Moments. We are given a stream (sequence) of N characters (or items) a_1, a_2, …, a_N from a large alphabet Σ of size |Σ| = n. Definition 1. A histogram ...
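Since the snippet mentions the Count-Min sketch, here is a minimal Python sketch of that data structure; the width, depth, and salted-hash scheme are illustrative choices, not the parameters used in the notes.

import numpy as np

class CountMinSketch:
    """Minimal Count-Min sketch: `depth` rows of `width` counters; each row
    hashes the item with its own salt, and the estimate is the row-wise minimum."""

    def __init__(self, width=2718, depth=5, seed=0):
        rng = np.random.default_rng(seed)
        self.width, self.depth = width, depth
        self.table = np.zeros((depth, width), dtype=np.int64)
        self.salts = [int(s) for s in rng.integers(0, 2**31, size=depth)]

    def _index(self, row, item):
        # Salted built-in hash, standing in for a pairwise-independent hash family.
        return hash((self.salts[row], item)) % self.width

    def update(self, item, count=1):
        for r in range(self.depth):
            self.table[r, self._index(r, item)] += count

    def estimate(self, item):
        # Each counter an item touches is >= its true count, so min never underestimates.
        return int(min(self.table[r, self._index(r, item)] for r in range(self.depth)))

# Usage on a tiny stream.
cms = CountMinSketch()
for token in ["a", "b", "a", "c", "a", "b"]:
    cms.update(token)
print(cms.estimate("a"), cms.estimate("b"), cms.estimate("z"))  # 3, 2, and likely 0

Every counter an item touches is at least its true frequency, so the row-wise minimum never underestimates; with width about e/ε and depth about ln(1/δ), the standard analysis bounds the overestimate by εN with probability at least 1 − δ.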


Lecture 6: Johnson-Lindenstrauss Lemma: Dimension Reduction

Observe that for any three points, if the three distances between them are given, then the three angles are fixed. Given n−1 vectors, the vectors together with the origin form a set of n points. In fact, given any n points in Euclidean space (in at most n−1 dimensions), the Johnson-Lindenstrauss Lemma states that the n points can be placed in O(log n / ε²) dimensions such that distances are preserved wi...
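To make the preservation statement concrete, the following minimal check (same Gaussian-matrix assumption as in the sketch above; the constant in K is illustrative) measures the worst squared-distance distortion over all pairs of points.

import numpy as np
from itertools import combinations

rng = np.random.default_rng(1)
n_points, dim, eps = 50, 2000, 0.25
# Target dimension K = O(log n / eps^2); the constant 8 is illustrative.
K = int(np.ceil(8 * np.log(n_points) / eps ** 2))

X = rng.normal(size=(n_points, dim))
Phi = rng.normal(scale=1.0 / np.sqrt(K), size=(K, dim))
Y = X @ Phi.T

# Worst relative distortion of squared pairwise distances after projection.
worst = max(
    abs(np.linalg.norm(Y[i] - Y[j]) ** 2 / np.linalg.norm(X[i] - X[j]) ** 2 - 1.0)
    for i, j in combinations(range(n_points), 2)
)
print(f"K = {K}, worst squared-distance distortion = {worst:.3f} (target eps = {eps})")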


Johnson-Lindenstrauss lemma for circulant matrices

The original proof of Johnson and Lindenstrauss [11] uses (up to a scaling factor) an orthogonal projection onto a random k-dimensional subspace of R^d. We refer also to [7] for a beautiful and self-contained proof. Later on, this lemma found many applications, especially in the design of algorithms, where it sometimes allows one to substantially reduce the dimension of the underlying problem and break the...
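A minimal way to sample the kind of projection described above, assuming NumPy; the √(d/k) rescaling is the standard scaling factor, not a quotation from [11].

import numpy as np

def random_subspace_projection(d, k, rng=None):
    """Rows of the returned k x d matrix form an orthonormal basis of a
    uniformly random k-dimensional subspace of R^d (QR of a Gaussian matrix)."""
    rng = np.random.default_rng() if rng is None else rng
    G = rng.normal(size=(d, k))
    Q, _ = np.linalg.qr(G)   # Q has orthonormal columns spanning a random subspace
    return Q.T

d, k = 1000, 50
P = random_subspace_projection(d, k, np.random.default_rng(3))
x = np.random.default_rng(4).normal(size=d)
# Rescale by sqrt(d/k) so that E[||y||^2] = ||x||^2.
y = np.sqrt(d / k) * (P @ x)
print(f"||x|| = {np.linalg.norm(x):.3f}, ||y|| = {np.linalg.norm(y):.3f}")

The Q factor of a Gaussian matrix is an orthonormal basis of a uniformly (Haar) distributed k-dimensional subspace, and the √(d/k) factor makes the projected squared norm match ‖x‖² in expectation.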


Homework 2, Foundations of Algorithms for Massive Datasets (236779), Fall 2015

Small coherence (i.e., incoherence) has nice properties, as we shall see now. Assume A has coherence μ < 1. Prove that for all s < 1/μ, A satisfies the RIP with parameters s and δ = sμ. Hint: For s-sparse x, expand ‖Ax‖² = (Ax)ᵀ(Ax) using inner products of the columns of A. Note: there exist deterministic constructions of k×d matrices with coherence ∼ 1/√k, which by the above imply RIP with parameters s = √k, ...
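A small numerical sanity check of the claimed implication, with all sizes chosen arbitrarily for illustration; this checks the bound empirically on random sparse vectors, it is not the requested proof.

import numpy as np

rng = np.random.default_rng(0)
k, d, s = 400, 800, 3

# Random matrix with unit-norm columns; coherence mu is the largest absolute
# inner product between two distinct columns.
A = rng.normal(size=(k, d))
A /= np.linalg.norm(A, axis=0)
G = A.T @ A
mu = np.max(np.abs(G - np.eye(d)))

# Check the claimed bound | ||Ax||^2 / ||x||^2 - 1 | <= s * mu on random s-sparse x.
worst = 0.0
for _ in range(500):
    x = np.zeros(d)
    support = rng.choice(d, size=s, replace=False)
    x[support] = rng.normal(size=s)
    ratio = np.linalg.norm(A @ x) ** 2 / np.linalg.norm(x) ** 2
    worst = max(worst, abs(ratio - 1.0))
print(f"mu = {mu:.3f}, s*mu = {s * mu:.3f}, worst observed deviation = {worst:.3f}")

With unit-norm columns, every off-diagonal entry of AᵀA has magnitude at most μ, which is where the deviation of at most (s − 1)μ ≤ sμ in the exercise comes from.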




Publication date: 2015